
Senior Project Interim Self-Assessment

This document is intended as a guide for the senior project team to assess its performance in a number of dimensions. You need not answer each question in detail; rather, use the questions as a guide for the kinds of items to assess. Add items you feel are appropriate.

This self-assessment will be one of multiple elements that your faculty coach uses to arrive at an assessment of the team's performance for this first term. The other elements that the faculty coach will use include direct observation of the team, team peer evaluations, reviews by other faculty during the interim project presentation, and the sponsor's evaluation. These self-assessments will also be used as part of the SE program's accreditation effort.

To complete this self-assessment the team should carefully consider each of the questions and provide an honest evaluation of the team's performance.  Your faculty coach will inform you when this self-assessment is due and how to deliver it.

Team: Wirox

Project: SQL Wiki Plugin

Sponsor: Xerox


Product

Did the team prepare all the documentation artifacts requested by your faculty coach and sponsor?  Were these documents carefully inspected prior to delivery?  How would you assess the quality of the document artifacts?

Yes. All of our documents are living artifacts maintained on the team wiki, so they are available to the sponsor at any time. Although the sponsor has access to all of our artifacts through the wiki, the key artifacts for them are the Product Plan and the Product Backlog. Since the group worked on the artifacts together, and since they were reviewed and accepted by both the sponsor and our faculty coach, we believe they are of high quality.

How well did the team elicit the requirements?  Are the requirements fully specified at this point?  What approaches were used to elicit the requirements?  Were key requirements missed?  What methodology was used to document and validate the project requirements?

We used a requirements workshop to elicit the initial set of requirements. Our requirements are not fully specified, which is why we adopted Scrum, an agile methodology that lets us and the sponsor reevaluate the requirements at the end of each sprint. At this point we do not believe that any key high-level requirements have been missed, but the details of future user stories are still uncertain. The requirements are embodied in our user stories and are refined over time through evaluation and review, ensuring that we are implementing what the sponsor wants.

Did the team explore the entire design space before arriving at a final design?  Have there been many errors found in the design?  Was it necessary to make major changes to any part of the design?  What were the reasons for the change?  Do you have a complete design at this point?

When we began working on Bob Swift's SQL plugin, we started with refactoring sessions to separate out functionality and make the plugin more extensible and testable (a sketch of this kind of extraction appears below). We do not have a final design at the moment; instead, we reevaluate the design at the start of each user story implementation, which tells us whether we need to modify or refine the design in order to implement the added story. We will never have a final design because we are using the Extreme Programming approach of not adding functionality to the design before it is scheduled, thereby keeping the design simple by handling only the functionality that is required.
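
As an illustration of the kind of extraction described above, the following is a minimal sketch of how query execution might be pulled out of a macro class so it can be tested without a running Confluence instance. The names here (ConnectionProvider, SqlExecutor) are hypothetical, not the actual plugin code.

    import java.sql.Connection;
    import java.sql.ResultSet;
    import java.sql.ResultSetMetaData;
    import java.sql.SQLException;
    import java.sql.Statement;
    import java.util.ArrayList;
    import java.util.List;

    // Hypothetical seam: lets tests supply their own database connection.
    interface ConnectionProvider {
        Connection getConnection() throws SQLException;
    }

    // Hypothetical extraction of query execution out of the Confluence
    // macro, so the logic can be unit tested in isolation.
    public class SqlExecutor {
        private final ConnectionProvider provider;

        public SqlExecutor(ConnectionProvider provider) {
            this.provider = provider;
        }

        // Runs a query and returns each row as a list of column values.
        public List<List<String>> execute(String sql) throws SQLException {
            try (Connection conn = provider.getConnection();
                 Statement stmt = conn.createStatement();
                 ResultSet rs = stmt.executeQuery(sql)) {
                ResultSetMetaData meta = rs.getMetaData();
                List<List<String>> rows = new ArrayList<>();
                while (rs.next()) {
                    List<String> row = new ArrayList<>();
                    for (int i = 1; i <= meta.getColumnCount(); i++) {
                        row.add(rs.getString(i));
                    }
                    rows.add(row);
                }
                return rows;
            }
        }
    }

A seam like this is one way to raise test coverage without having to mock Confluence itself.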

How has the development and implementation progressed?  What percentage of the product do you estimate is complete at this point?  Is the team providing the documentation within the implementation artifacts?

We have made consistent implementation progress. Our velocity was the same for our first two sprints and we are currently in our third. Based on our estimates, we have completed around a third of the user stories and a quarter of the story points, putting the product at roughly 35% complete at this point. The original documentation for the plugin still exists, but we have not yet supplemented it with documentation for our added functionality. We are optimistic that our velocity will increase as we progress into future sprints and that we will be able to finish on time.

What is the team's testing strategy?  Has the team developed a test plan?  Is the team performing unit testing?  Is the team using any test frameworks, such as JUnit? What are the testing results to date?  Were any major defects found during system test?

For our testing we have been using JUnit and HTTPUnit. We aim to have unit tests for all of our functionality with around 80% line coverage; we currently have 82% line coverage. For coverage reports we have been using Emma. No major defects have been found during system testing. Since the Confluence framework is very large and complicated, there are places where we would have to mock framework functionality to get full test coverage. To mitigate the risk of having untested code in these areas, we have extracted as much functionality out of them as possible (see the test sketch below).
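
As a concrete example, a JUnit test against the hypothetical SqlExecutor sketched earlier might look like the following. It assumes an in-memory HSQLDB database as a stand-in for MySQL/Oracle; this is an illustration, not necessarily how the team's actual tests are written.

    import static org.junit.Assert.assertEquals;

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.SQLException;
    import java.util.List;
    import org.junit.Test;

    // Hypothetical unit test; exercises SqlExecutor without Confluence.
    public class SqlExecutorTest {

        private Connection open() throws SQLException {
            // In-memory HSQLDB database; lives for the duration of the JVM.
            return DriverManager.getConnection("jdbc:hsqldb:mem:wirox", "sa", "");
        }

        @Test
        public void executeReturnsEveryRow() throws Exception {
            try (Connection setup = open()) {
                setup.createStatement().execute("CREATE TABLE users (name VARCHAR(20))");
                setup.createStatement().execute("INSERT INTO users VALUES ('alice')");
            }
            SqlExecutor executor = new SqlExecutor(new ConnectionProvider() {
                public Connection getConnection() throws SQLException {
                    return open();
                }
            });
            List<List<String>> rows = executor.execute("SELECT name FROM users");
            assertEquals(1, rows.size());
            assertEquals("alice", rows.get(0).get(0));
        }
    }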

Products need to be designed within guidelines and constraints appropriate for each project.  It is also important to consider the impacts of the products that are designed.  In the following categories discuss the constraints and impacts that have a bearing on your project.  Note that there may be one or two categories that have no bearing on your project but your project is probably affected by almost all of these.

  • Economic issues - The plugin is intended for use in a commercial environment, so if it were used in a destructive manner there could be negative economic consequences.
  • Environmental issues- N/A
  • Social issues - N/A
  • Political issues - N/A
  • Ethical issues - This plugin may be used to handle sensitive information, and our requirements do not include any security functionality, so the plugin could be used to manipulate or destroy sensitive data.
  • Health and safety - N/A
  • Manufacturability - N/A
  • Sustainability - N/A

What industry and engineering standards must your project adhere to?  Were these new standards that the team had to learn?  Did your sponsor provide you support for understanding these standards?  Did you have to educate your sponsor about these standards?

The team has decided to follow many, but not all, of the Extreme Programming rules; one rule we have not followed is the test-first approach to coding. There are no general industry standards we had to adhere to, but Confluence defines a specific way for the wiki to communicate with plugins, so we had to follow the conventions required for a plugin to run on Confluence (sketched below). This was the only standard that applied to our project.

Hawker comment:  You used SQL and Confluence Wiki Plugin development guidelines
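
For reference, the "specific way to communicate" is Confluence's macro API: a plugin supplies classes extending the renderer's macro base class and registers them in the plugin descriptor. The sketch below follows the v2 macro API Confluence used at the time; the class name and helper method are hypothetical, and exact signatures should be checked against the Confluence version in use.

    import java.util.Map;
    import com.atlassian.renderer.RenderContext;
    import com.atlassian.renderer.v2.RenderMode;
    import com.atlassian.renderer.v2.macro.BaseMacro;
    import com.atlassian.renderer.v2.macro.MacroException;

    // Hedged sketch of the contract Confluence imposes on macro plugins.
    public class SqlMacro extends BaseMacro {

        public boolean hasBody() {
            return true; // the macro body carries the SQL statement
        }

        public RenderMode getBodyRenderMode() {
            return RenderMode.NO_RENDER; // treat the body as raw text, not wiki markup
        }

        public String execute(Map parameters, String body, RenderContext context)
                throws MacroException {
            // Delegate to a separately testable helper rather than doing JDBC work here.
            return renderResultsAsHtml(body);
        }

        // Placeholder: the real plugin runs the query and renders an HTML table.
        private String renderResultsAsHtml(String sql) {
            return "<table><!-- rendered query results --></table>";
        }
    }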

Process

What is your process methodology?  Has this been clearly outlined to your sponsor and received the sponsor's approval?  How is the process documented?

The process methodology we have chosen is Scrum with XP engineering practices. This choice has been clearly outlined to the sponsor and is in fact a result of their use of it in their own development. They approve of the process, help us execute and adhere to it, and the experience has been fruitful. The process is documented in the project plan and is also clearly evident in the artifacts maintained on our project wiki page.

Was there a large requirement to learn the problem domain?  What approach was used to gain domain expertise?  Did your sponsor provide adequate support? What forms of support did you receive?

There was a moderate need to learn the problem domain. Team members had good experience with database development but had to brush up in areas pertinent to plugin and web development. This was not a large issue, and we have made steady progress in building the necessary domain knowledge. To gain domain experience we experimented within the existing plugin and did individual research in areas where more knowledge was needed. The sponsor provided adequate support, typically by sharing useful resources and personal experiences with the domain.

What mechanisms is the team using to track project progress?  How well has the team tracked its project progress?  How often do these artifacts get updated on the department project website?

The mechanisms the team uses to track project progress are predominantly Scrum artifacts. We use the product backlog, sprint backlogs, various sprint charts, and individual user stories with their task breakdowns to see where we are in the project at various levels of granularity. These artifacts are updated whenever necessary (e.g., on completion of a task for a user story, or a product backlog update when a user story is completed), and the updates are reflected on the department project website whenever the team project wiki page is exported and uploaded to the department site. The team has tracked its progress very well: charts and tables relevant to development are typically updated daily, providing the most current snapshot of project status. These artifacts are tightly intertwined with our methodology and have been very helpful in providing direction for the project.

Is the team conducting effective meetings?  What can be changed to make the team meetings more productive?

The team conducts Scrum meetings each development day for a maximum of 15 minutes. These meetings are effective in getting each team member up to date on the work being done by each individual (or, more often, each pair). They also help highlight risks that may materialize, as well as current status and impediments within the sprint backlog. Time-boxing the meetings helps ensure efficiency. We also hold team meetings whenever more important design decisions are being made for a user story, so that everyone is up to speed on project changes and can help identify the best solution by drawing on everyone's experience. There is not much to change to make the meetings more productive; it is simply important that we continue to adhere to the 15-minute maximum for Scrum meetings and make sure everyone is present for larger project decisions.

Hawker comment:  also sprint reflection meetings and sprint planning meetings

Has the team met all project milestones to date?  Which milestones, if any, were missed or were met ahead of schedule?  What contributed to these schedule changes?  What will the team do differently to ensure that future milestones are met?

So far the team has met all project milestones mandated by the SE department. Department milestones are always completed at or ahead of the scheduled time, a result of reviewing the SE senior project schedule on a weekly basis and allocating tasks appropriately. Other project milestones lie in the user stories taken on for each sprint. These are not guaranteed to be completed, since during a sprint we can move some stories to the back burner or take on more user stories depending upon that sprint's velocity. So far we have done well in keeping a consistent velocity and meeting sprint goals.

Was the team required to adopt new technologies?  What were these technologies?  What approach did the team use for selecting the appropriate technology for the project?  Did the sponsor provide any support for learning these technologies?  How well did the team ramp up on the new technologies and begin to apply them effectively?

The original plugin was written in Java, so we had no choice but to continue using Java. We use servlets for communication between the front end and the database manager, Velocity templates to generate the HTML contents of each macro, and XML for passing request information (see the sketch below). To provide an interactive experience we use JavaScript and jQuery. For development we have deployed our own Confluence wiki to run the plugin on, and Maven is used as the build manager. Xerox made it clear that we could develop using MySQL, but the end product needs to work with Oracle 10g since that is what they use. Most of these technologies had been used by at least one teammate, so adopting them was easy: we either shared experience and knowledge with fellow team members or refreshed our memories through individual research. The one technology we knew nothing about was Confluence; however, the sponsor has been there to help whenever help has been requested. The team ramped up well on the new technologies and has not run into any major issues.
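
To make the servlet layer concrete, here is a minimal sketch of the kind of servlet described above, answering a front-end request with XML. The servlet name, parameter name, and response format are illustrative assumptions, not the actual Wirox code.

    import java.io.IOException;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    // Hypothetical servlet: the page's JavaScript/jQuery posts here and
    // receives XML describing the result of the request.
    public class QueryStatusServlet extends HttpServlet {

        protected void doPost(HttpServletRequest req, HttpServletResponse resp)
                throws IOException {
            String query = req.getParameter("sql"); // illustrative parameter name
            resp.setContentType("text/xml");
            resp.getWriter().write("<response><status>ok</status><echo>"
                    + escape(query) + "</echo></response>");
        }

        // Minimal XML escaping so user input cannot break the document.
        private String escape(String s) {
            return s == null ? "" : s.replace("&", "&amp;").replace("<", "&lt;");
        }
    }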

How well has the team maintained quality control over the project artifacts?  Have all artifacts been reviewed for adherence to quality standards?  What is the review process used by the team?

There is no formal quality control over project artifacts. Typically, as things change we review documents and check them for validity. The review process generally involves team review sessions where the whole team reviews an artifact together. This is an area we could improve by introducing a formal review process. That said, we have been able to develop well with the artifacts we have created, so our general consensus is that they are adequate for our needs and of high quality.

Has the team had any issues with configuration management?  How were these problems solved?  What percentage of project artifacts is under configuration control?

We have had problems concerning the interoperability of NetBeans and Eclipse: several small issues arose when switching between the two IDEs. These were not major, but we need to be careful in the future when developing in both NetBeans and Eclipse.

Hawker comment:  would you consider wiki edit history as "change management" for non-code artifacts?

What is the set of metrics that the team is tracking?  Has the team gathered these metrics on a consistent basis?  What has the team learned from the review of these metrics?

The set of metrics the team is tracking is test coverage, velocity, test pass/fail, hour burn-down, and point burn-up. The team has gathered these metrics on a consistent basis, since most of them directly influence which user stories will be chosen for a given sprint and show how we are progressing within a sprint and across the project as a whole. Most of these metrics live on the project wiki page, while the test coverage and pass/fail reports are generated through Emma. From reviewing the metrics we continually "tweak" our sprints to take on the appropriate amount of work and check whether we are on track to complete the user stories chosen for a sprint. We review our metrics at each sprint post-mortem, and they are very important to the project.

Hawker comment:  what about the effort tracking metrics that the SE department requires? 

Communication and Interaction

How well has the team been communicating project progress to the sponsor?  What regular communication does the team have with the sponsor?  Has the team been maintaining this communication to the satisfaction of the sponsor?  Were any adjustments needed in the communication over time?  Were these changes initiated by the team or the sponsor?

The team has been communicating in a consistent and effective manner with the sponsor throughout the project. Primary communication between the team and the sponsor during sprints is handled by the Scrum Master in order to establish a consistent point of contact. Whenever necessary during a sprint (impediments, important design choices, user story clarifications, questions) there is e-mail contact between the team and the sponsor, and phone contact is possible in more time-sensitive situations. There is also a sprint review meeting at the start of every sprint, which relays the progress made during the last sprint and establishes the goals for the next. These meetings are conducted in person or by teleconference and internet meetings, depending on the sponsor's availability and location. There have been no communication adjustments so far, and the team believes that both parties are happy with the level of communication.

Hawker comment:  Sharing artifacts on the wiki also aids in effective communication.

Did the team need to provide technical input to the sponsor?  How well did the team educate the customer in these areas?  What mechanism did the team use?

The team did not need to provide technical input to the sponsor. The sponsor is typically well versed in the issues at hand, and any technical input to the sponsor concerns design and development choices or discoveries. Technical discussions about the project are held through e-mail during sprints and face-to-face at sprint meetings. Our sponsor contacts are technical people themselves, and there is no real domain knowledge gap between the team and the sponsor. In fact, the team has often asked the sponsor for input on potential approaches to technical issues, since they are experienced in the domain.

Is this an effective team?  What has been contributing to and detracting from the team's effectiveness?  What are the team's weak points?  What are the team's strong points?  What changes can the team make for next term that will make it more effective?

Thus far Team Wirox has been an effective team. We all knew each other prior to the project, and this helps in many ways: we see each other on a consistent basis regardless of the project, so it is not hard to coordinate activities and track people down. There is also another level of accountability, since all members are friends, and senior project attendance and collaboration have not been issues. The team's weak point, if any, is the potential to become distracted by each other, but it is better to enjoy senior project than to dread it, and progress is still strong. The team's strong points lie in its consistent and stable nature, its regular development times, and the good team chemistry, which helps in many aspects of the chosen process and in teamwork in general. Any changes for next term will likely be to the development times (hopefully larger blocks of time, spread out).

What mechanism does the team use to communicate with the faculty coach?  Has communication with the coach been effective?  Are there any trouble spots with the faculty coach communications?  What can the team change for next term to make their communication to the faculty coach more effective?  What can the faculty coach change to make his or her interaction with the team more effective?

There are no real issues with the team's communication with the faculty coach. Communication has been effective, and aside from normal check-in times on Tuesdays and Thursdays the coach is generally accessible and helpful with any issues or questions the team may have. No changes are needed for next quarter, as the team-to-coach communication has gone well and fulfilled the needs of the senior project team without impeding our process.

Hawker comment:  the structure and focus of the daily stand-up meeting provides me an efficient way to stay up to date.

Has the team needed to interact with department staff personnel, i.e. the office staff or Kurt?  Has this been handled in a professional manner?  Were there any problems with these interactions?

The team has had to interact on multiple occasions with department staff personnel. Some interaction with the CS and IT system administrators was brief, in searching for an Oracle database to use, but our primary interaction has been with Kurt. This is typically done by e-mail or face-to-face and is generally helpful. Some issues arise with timeliness or with communication about the status of requests, but we generally get what we need in adequate time. We plan ahead for most requests that need to go through Kurt, but sometimes we need to go back to him for missing information or for confirmation that progress is being made on a request.

Does the team have a complete website with all project artifacts stored and up-to-date on the software engineering department webserver, i.e. linus.se.rit.edu?  How often are entries on the webserver updated?

The team tries to keep an up-to-date, complete website of project artifacts on the SE webserver. Our primary location for project information is the team wiki page, which we try to export regularly to the SE department website. We envisioned doing this every day, but typically we forget and only do it when large changes or milestones occur. This is one change the team will try to make next quarter: getting the information from the daily-updated wiki page onto the SE-mandated website on a consistent basis (every other day).

How well has the team made presentations to the sponsor and faculty coach?  Was the interim project presentation done in a professional manner?  What can be done to improve the team's presentations?

Presentations to the sponsor occur at every sprint planning meeting: the current progress is described and typically a short demo is shown. We believe these are done in a professional manner, and to date there have been no issues with successfully conveying project information to the sponsor or faculty coach.

The team presentation for the interim report was done well, but some expected elements were ignored by the team: we chose to wear business casual rather than formal attire, and only two of the team members presented material. Although we considered these elements not directly related to the project, they were noticed during the presentation and may have distracted some evaluators from the content. The presentation overall was considered to be of good quality, but future presentations by the team should leave nothing for evaluators to nitpick.

In the future, during more formal presentations, the team will pay more attention to non-content elements such as dress and full team participation.

Hawker comment:  I would not worry about full team participation in the presentation.  Choose the best mechanisms to communicate, and remove distractions (for example, have people not presenting step clearly aside).

How well has the team worked with other senior project teams, coordinating access to lab space and equipment, sharing experiences and ideas, etc.?

There have not been many issues in working with other senior project teams. Resource contention has not been a problem, and we have set up a good schedule for development times in which we have access to whatever space and equipment we need. We have done well in sharing experiences, progress, and ideas with other senior project teams. There are no problems in this area, and cross-team communication is successful and at times helpful.

Achieving Customer Satisfaction

In the team's opinion has the work accomplished to date satisfied the project sponsor?  Were there any weak spots in this regard?

In the team's opinion, the work accomplished so far has satisfied the project sponsor. They have expressed confidence in the progress and seem happy with the current state of the functionality and the plugin in general. We are currently facing some impediments in running our plugin on Oracle (since we develop using MySQL), but constant communication with the sponsor has been beneficial in working through them. We believe that the sponsor and the senior project team are well linked and maintain a good level of communication, so that both parties are aware of the project's progress and status. We do not believe there are any weak spots in this regard.
